Isometry-enforcing Data Transformations for Improving Sparse Model Learning

Authors

  • Avishy Carmi
  • Irina Rish
  • Guillermo Cecchi
  • Dimitri Kanevsky
  • Bhuvana Ramabhadran
Abstract

Imposing sparsity constraints (such as l1-regularization) on the model parameters is a practical and efficient way of handling very high-dimensional data, which also yields interpretable models due to embedded feature selection. Compressed sensing (CS) theory provides guarantees on the quality of sparse signal (in our case, model) reconstruction; these guarantees rely on the so-called restricted isometry property (RIP) of the sensing (design) matrices. In our setting, however, RIP cannot be guaranteed, since the sensing matrices are formed from subsets of the underlying data set rather than being generated at random. Nevertheless, as we show, one can find a distance-preserving linear transformation of the data such that any transformed subspace of the data satisfies the RIP at some level. We demonstrate the effects of such RIP-enforcing data transformations on sparse learning methods, such as sparse and compressed Random Fields as well as sparse regression (LASSO), in the context of classifying mental states based on fMRI data.
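The general idea can be illustrated with a small numerical sketch. The Python snippet below is only an assumption-laden stand-in for the paper's procedure: it estimates an empirical restricted-isometry constant over random column subsets of a design matrix, applies a simple column-normalization and SVD-based whitening step as an illustrative isometry-enforcing transform (valid only when the number of samples is at least the number of features, unlike the typical fMRI regime), and then fits LASSO on the transformed design. The helper `empirical_rip_constant` and the specific transform are hypothetical choices, not the authors' method.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)

def empirical_rip_constant(X, s, n_trials=200):
    """Monte-Carlo lower bound on the restricted isometry constant delta_s of X."""
    _, p = X.shape
    delta = 0.0
    for _ in range(n_trials):
        S = rng.choice(p, size=s, replace=False)          # random s-sized support
        eigvals = np.linalg.eigvalsh(X[:, S].T @ X[:, S])  # spectrum of the sub-Gram matrix
        delta = max(delta, abs(eigvals[-1] - 1.0), abs(1.0 - eigvals[0]))
    return delta

# Toy data: n samples, p features, s-sparse ground-truth weights.
n, p, s = 100, 50, 5
X = rng.normal(size=(n, p))
w_true = np.zeros(p)
w_true[:s] = 1.0
y = X @ w_true + 0.1 * rng.normal(size=n)

# Illustrative isometry-enforcing step: normalize columns, then whiten via SVD
# so that the transformed Gram matrix is the identity (requires n >= p).
Xc = X / np.linalg.norm(X, axis=0)
U, d, Vt = np.linalg.svd(Xc, full_matrices=False)
T = Vt.T @ np.diag(1.0 / d) @ Vt                          # linear transform, Xw = Xc @ T
Xw = Xc @ T

print("empirical delta_s before:", empirical_rip_constant(Xc, s))
print("empirical delta_s after: ", empirical_rip_constant(Xw, s))

# Sparse regression (LASSO) on the transformed design; because the transform is
# linear, coefficients for the column-normalized design are recovered via the same T.
w_transformed = Lasso(alpha=0.05).fit(Xw, y).coef_
w_normalized_space = T @ w_transformed
```

The relevant design point is that the transform is linear and invertible on the data's span, so coefficients learned on the transformed design can be mapped back through the same matrix, keeping the interpretability of the selected features.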


Similar articles

Sparse Structured Principal Component Analysis and Model Learning for Classification and Quality Detection of Rice Grains

In scientific and commercial fields associated with modern agriculture, categorizing different rice types and determining their quality is very important. Various image processing algorithms have been applied in recent years to detect different agricultural products. The problem of rice classification and quality detection in this paper is presented based on model learning concepts includ...

Sparse Recovery on Euclidean Jordan Algebras

We consider the sparse recovery problem on Euclidean Jordan algebra (SREJA), which includes sparse signal recovery and low-rank symmetric matrix recovery as special cases. We introduce the restricted isometry property, null space property (NSP), and s-goodness for linear transformations in s-sparse element recovery on Euclidean Jordan algebra (SREJA), all of which provide sufficient conditions ...

Image Classification via Sparse Representation and Subspace Alignment

Image representation is a crucial problem in image processing, where there exist many low-level representations of an image, e.g., SIFT, HOG, and so on. But there is a missing link between low-level and high-level semantic representations. In fact, traditional machine learning approaches, e.g., non-negative matrix factorization, sparse representation and principal component analysis are employed to d...

Adaptive Feature Selection: Computationally Efficient Online Sparse Linear Regression under RIP

Online sparse linear regression is an online problem where an algorithm repeatedly chooses a subset of coordinates to observe in an adversarially chosen feature vector, makes a real-valued prediction, receives the true label, and incurs the squared loss. The goal is to design an online learning algorithm with sublinear regret to the best sparse linear predictor in hindsight. Without any assumpt...

Gap Safe Screening Rules for Sparsity Enforcing Penalties

In high-dimensional regression settings, sparsity-enforcing penalties have proved useful to regularize the data-fitting term. A recently introduced technique called screening rules proposes to ignore some variables during optimization, leveraging the expected sparsity of the solutions and consequently leading to faster solvers. When the procedure is guaranteed not to discard variables wrongly the...

Publication date: 2009